Syllable and language model based features for detecting non-scorable tests in spoken language proficiency assessment applications

Authors

  • Angeliki Metallinou
  • Jian Cheng
Abstract

This work introduces new methods for detecting non-scorable tests, i.e., tests that cannot be accurately scored automatically, in educational applications of spoken language proficiency assessment. These include cases of unreliable automatic speech recognition (ASR), often caused by noisy, off-topic, foreign or unintelligible speech. We examine features that estimate signal-derived syllable information and compare it with the ASR results in order to detect responses with problematic recognition. Further, we explore the usefulness of language model based features, both for language models that are highly constrained to the spoken task and for task-independent phoneme language models. We validate our methods on a challenging dataset of young English language learners (ELLs) interacting with an automatic spoken assessment system. Our proposed methods achieve performance comparable to that of existing non-scorable detection approaches, and lead to a 21% relative performance increase when combined with them.
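As a rough illustration of the syllable-based idea, the sketch below (Python, with hypothetical function names, window sizes and thresholds; it is not the authors' implementation) approximates syllable nuclei as peaks of a smoothed energy envelope and measures how strongly that signal-derived estimate disagrees with the syllable count implied by the ASR hypothesis. A large disagreement would flag the response as potentially non-scorable.

```python
# Illustrative sketch only: approximate signal-derived syllable nuclei as peaks
# of a smoothed energy envelope and compare them with the ASR syllable count.
# All names, window sizes and thresholds are assumptions, not the paper's values.
import numpy as np
from scipy.signal import find_peaks

def estimate_syllable_count(energy_envelope, frame_rate=100.0):
    """Count peaks of a per-frame energy envelope as putative syllable nuclei."""
    window = max(1, int(0.05 * frame_rate))               # ~50 ms moving average
    smoothed = np.convolve(energy_envelope, np.ones(window) / window, mode="same")
    peaks, _ = find_peaks(smoothed,
                          height=0.3 * smoothed.max(),    # ignore low-energy ripples
                          distance=int(0.1 * frame_rate)) # >= 100 ms between nuclei
    return len(peaks)

def syllable_mismatch_feature(energy_envelope, asr_syllable_count):
    """Relative disagreement between signal-based and ASR-based syllable counts."""
    signal_count = estimate_syllable_count(energy_envelope)
    return abs(signal_count - asr_syllable_count) / max(asr_syllable_count, 1)
```

Such a mismatch score could then serve, alongside language model based features, as one input to a non-scorable response classifier.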


Related articles

A Model for Detecting of Persian Rumors based on the Analysis of Contextual Features in the Content of Social Networks

A rumor is a collective attempt to interpret a vague but attractive situation through the power of words; therefore, identifying the language of rumors can help in detecting them. Previous research has focused more on the contextual information of reply tweets and less on the content features of the original rumor when addressing the rumor detection problem. Most of the studies have been in...


The Effect of Variations in Integrated Writing Tasks and Proficiency Level on Features of Written Discourse Generated by Iranian EFL Learners

In recent years, a number of large-scale writing assessments (e.g., TOEFL iBT) have employed integrated writing tests to measure test takers’ academic writing ability. Using a quantitative method, the current study examined how written textual features and use of source material(s) varied across two types of text-based integrated writing tasks (i.e., listening-to-write vs. reading-to-write) and...


Automatic scoring of non-native children's spoken language proficiency

In this study, we aim to automatically score the spoken responses from an international English assessment targeted to non-native English-speaking children aged 8 years and above. In contrast to most previous studies focusing on scoring of adult non-native English speech, we explored automated scoring of child language assessment. We developed automated scoring models based on a large set of fe...


Non-scorable Response Detection for Automated Speaking Proficiency Assessment

We present a method that filters out non-scorable (NS) responses, such as responses affected by a technical difficulty, in an automated speaking proficiency assessment system. The assessment system described in this study first filters out the non-scorable responses and then predicts a proficiency score using a scoring model for the remaining responses. The data were collected from non-native speakers ...
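The two-stage design described above (filter, then score) can be summarised with a minimal sketch; the classifier and scoring model objects below are placeholders with an assumed predict-style interface, not the system from the cited study.

```python
# Hypothetical two-stage pipeline: reject non-scorable (NS) responses first,
# then score only the responses that pass the filter.
def assess_responses(responses, ns_classifier, scoring_model):
    results = {}
    for response_id, features in responses.items():
        if ns_classifier.predict([features])[0]:   # True -> non-scorable
            results[response_id] = None            # withhold an automatic score
        else:
            results[response_id] = scoring_model.predict([features])[0]
    return results
```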


English and Persian Undergraduate Students’ Perceptions of the Construct-(ir)Relevance of Language Proficiency in the Assessment of Literary Competence

Of the many dilemmas facing the assessment of literary competence, one is the extent to which language should constitute part of the target construct intended to be measured. Some argue for the construct-irrelevance of language and hence recommend that it be eliminated or minimized in favor of an exclusive focus on literary competence. In practice, this does not seem to be the case, as language...



Publication date: 2014